Sign rank, VC dimension and spectral gaps
Authors
Abstract
We study the maximum possible sign rank of N×N sign matrices with a given VC dimension d. For d = 1, this maximum is 3. For d = 2, this maximum is Θ̃(N^{1/2}). Similar (slightly less accurate) statements hold for d > 2 as well. We discuss the tightness of our methods, and describe connections to combinatorics, communication complexity and learning theory. We also provide explicit examples of matrices with low VC dimension and high sign rank. Let A be the N×N point-hyperplane incidence matrix of a finite projective geometry with order n ≥ 3 and dimension d ≥ 2. The VC dimension of A is d, and we prove that its sign rank is larger than N^{1/2 − 1/(2d)}. The large sign rank of A demonstrates yet another difference between finite and real geometries. To analyse the sign rank of A, we introduce a connection between sign rank and spectral gaps, which may be of independent interest. Consider the N×N adjacency matrix of a ∆-regular graph whose second largest eigenvalue in absolute value is λ, with ∆ ≤ N/2. We show that the sign rank of the signed version of this matrix is at least ∆/λ. A similar statement holds for all regular (not necessarily symmetric) sign matrices. We also describe limitations of this approach, in the spirit of the Alon-Boppana theorem.
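The spectral lower bound stated in the abstract (sign rank ≥ ∆/λ for a ∆-regular graph) is easy to evaluate numerically. The sketch below, our own illustration rather than anything from the paper, computes the bound for the Petersen graph, a 3-regular graph on 10 vertices whose second eigenvalue in absolute value is 2:

```python
import numpy as np

def petersen_adjacency():
    """Adjacency matrix of the Petersen graph (3-regular, 10 vertices)."""
    edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 0),   # outer 5-cycle
             (5, 7), (7, 9), (9, 6), (6, 8), (8, 5)]   # inner pentagram
    edges += [(i, i + 5) for i in range(5)]            # spokes
    A = np.zeros((10, 10))
    for u, v in edges:
        A[u, v] = A[v, u] = 1.0
    return A

A = petersen_adjacency()
delta = int(A.sum(axis=1)[0])                 # regular degree: 3
eigs = np.linalg.eigvalsh(A)                  # spectrum: 3, 1 (x5), -2 (x4)
lam = max(abs(e) for e in eigs if not np.isclose(e, delta))  # lambda = 2
bound = delta / lam                           # spectral sign-rank bound: 1.5
print(delta, lam, bound)
```

Since the sign rank is an integer, a bound of ∆/λ = 1.5 here already certifies sign rank at least 2; the interesting regime in the paper is graphs with a large spectral gap, where ∆/λ is large.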
Similar resources
Sign rank versus VC dimension
This work studies the maximum possible sign rank of N×N sign matrices with a given VC dimension d. For d = 1, this maximum is three. For d = 2, this maximum is Θ̃(N^{1/2}). For d > 2, similar but slightly less accurate statements hold. The lower bounds improve over previous ones by Ben-David et al., and the upper bounds are novel. The lower bounds are obtained by probabilistic constructions, using a...
Sign rank versus Vapnik-Chervonenkis dimension
This work studies the maximum possible sign rank of sign (N×N)-matrices with a given Vapnik-Chervonenkis dimension d. For d = 1, this maximum is three. For d = 2, this maximum is Θ̃(N^{1/2}). For d > 2, similar but slightly less accurate statements hold. The lower bounds improve on previous ones by Ben-David et al., and the upper bounds are novel. The lower bounds are obtained by probabilistic constr...
COS 511: Theoretical Machine Learning
The dot sign denotes the inner product. If b is forced to be 0, the VC dimension reduces to n. It is often the case that the VC dimension equals the number of free parameters of a concept (for example, a rectangle's parameters are its topmost, bottommost, leftmost and rightmost bounds, and its VC dimension is 4). However, this is not always true; there exist concepts with 1 parameter but an infin...
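The rectangle example above can be checked directly: axis-aligned rectangles shatter four points in "diamond" position, but no set that includes a fifth point inside the others' bounding box can be shattered, since any rectangle containing a subset S also contains the bounding box of S. A minimal sketch (the function name and point sets are our own illustration):

```python
from itertools import combinations

def shatters(points):
    """Check whether axis-aligned rectangles shatter the given 2D points.

    For each nonempty subset S, the bounding box of S is the tightest
    rectangle containing S; S is realizable iff that box traps no point
    outside S. The empty subset is always realizable (empty rectangle).
    """
    pts = set(points)
    for r in range(1, len(points) + 1):
        for S in combinations(points, r):
            xs = [p[0] for p in S]
            ys = [p[1] for p in S]
            def in_box(p):
                return min(xs) <= p[0] <= max(xs) and min(ys) <= p[1] <= max(ys)
            if any(in_box(p) for p in pts - set(S)):
                return False  # some rectangle-definable labeling is impossible
    return True

diamond = [(0, 1), (1, 0), (0, -1), (-1, 0)]
print(shatters(diamond))            # four points: shattered
print(shatters(diamond + [(0, 0)])) # a fifth interior point breaks shattering
```

This matches the claim in the text: four well-placed points witness VC dimension 4, and the extra interior point shows why five points never suffice.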
Identification vs. Self-Verification in Virtual Communities (VC): Theoretical Gaps and Design Implications
Identity-related processes have been identified as important in explaining virtual community (VC) member behavior as well as informing system design of VCs. In particular, the two distinct identity processes of self-verification and identification have been identified and investigated separately, portrayed as two distinctive or contradictory identity processes with different practical implicati...
Introduction to Machine Learning: Class Notes 67577
An introduction to machine learning covering statistical inference (Bayes, EM, ML/MaxEnt duality), algebraic and spectral methods (PCA, LDA, CCA, clustering), and PAC learning (the formal model, VC dimension, the Double Sampling theorem).
Journal: Electronic Colloquium on Computational Complexity (ECCC)
Volume 21, Issue -
Pages: -
Published: 2014